
    The Generalizability of Private Sector Research on Software Project Management in Two USAF Organizations: An Exploratory Study

    Project managers typically set three success criteria for their projects: meet specifications, be on time, and be on budget. However, software projects frequently fail to meet these criteria. Software engineers, acquisition officers, and project managers have all studied this issue and made recommendations for achieving success. But most of this research in peer reviewed journals has focused on the private sector. Researchers have also identified software acquisitions as one of the major differences between the private sector and public sector MIS. This indicates that the elements for a successful software project in the public sector may be different from the private sector. Private sector project success depends on many elements. Three of them are user interaction with the project's development, critical success factors, and how the project manager prioritizes the traditional success criteria. High user interaction causes high customer satisfaction, even when the traditional success criteria are not completely met. Critical success factors are those factors a project manager must properly handle to avoid failure. And priorities influence which success criteria the project manager will most likely succeed in meeting. Through a survey of software project managers at two USAF software development organizations, my research discovered the following: 1) Air Force software project managers' top priority is fulfilling requirements, 2) user interaction during the software life cycle strongly influences user satisfaction with the final product, and 3) Air Force and private sector projects share many of the same critical success factors for nonweapon systems, but there are still some sharp differences.

    Networking at NASA. Johnson Space Center

    A series of viewgraphs on computer networks at the Johnson Space Center (JSC) are given. Topics covered include information resource management (IRM) at JSC, the IRM budget by NASA center, networks evolution, networking as a strategic tool, the Information Services Directorate charter, and SSC network requirements, challenges, and status.

    Advanced software integration: The case for ITV facilities

    The array of technologies and methodologies involved in the development and integration of avionics software has moved almost as rapidly as computer technology itself. Future avionics systems involve major advances and risks in the following areas: (1) Complexity; (2) Connectivity; (3) Security; (4) Duration; and (5) Software engineering. From an architectural standpoint, the systems will be much more distributed, involve session-based user interfaces, and have the layered architectures typified in the layers of abstraction concepts popular in networking. Typified in the NASA Space Station Freedom will be the highly distributed nature of software development itself. Systems composed of independent components developed in parallel must be bound by rigid standards and interfaces, and by clean requirements and specifications. Avionics software provides a challenge in that it cannot be flight tested until the first time it literally flies. It is the binding of requirements for such an integration environment into the advances and risks of future avionics systems that forms the basis of the presented concept and of the basic Integration, Test, and Verification concept within the development and integration life cycle of Space Station Mission and Avionics systems.

    ALFA & 3D: integral field spectroscopy with adaptive optics

    One of the most important techniques for astrophysics with adaptive optics is the ability to do spectroscopy at diffraction limited scales. The extreme difficulty of positioning a faint target accurately on a very narrow slit can be avoided by using an integral field unit, which provides the added benefit of full spatial coverage. During 1998, working with ALFA and the 3D integral field spectrometer, we demonstrated the validity of this technique by extracting and distinguishing spectra from binary stars separated by only 0.26". The combination of ALFA & 3D is also ideally suited to imaging distant galaxies or the nuclei of nearby ones, as its field of view can be changed between 1.2"x1.2" and 4"x4", depending on the pixel scale chosen. In this contribution we present new results both on galactic targets, namely young stellar objects, and on extra-galactic objects including a Seyfert and a starburst nucleus. Comment: SPIE meeting 4007 on Adaptive Optical Systems Technology, March 2000
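
    As a minimal arithmetic sketch of the field-of-view trade-off mentioned above: assuming a square lenslet array with 16 spatial elements per side (the 16x16 format is an assumption made for illustration; the abstract quotes only the resulting 1.2" and 4" fields), the on-sky field is simply the number of lenslets per side times the chosen pixel scale.

```python
# Illustrative only: relates an integral-field unit's pixel scale to its
# on-sky field of view. The 16x16 lenslet format is an assumption made for
# this example; the abstract quotes only the resulting fields of view.
def field_of_view(n_lenslets_per_side: int, pixel_scale_arcsec: float) -> float:
    """Return the side length of the square field of view in arcseconds."""
    return n_lenslets_per_side * pixel_scale_arcsec

for scale in (0.075, 0.25):  # hypothetical pixel scales in arcsec per lenslet
    print(f'{scale}"/lenslet -> {field_of_view(16, scale):.1f}" field')
# 0.075"/lenslet -> 1.2" field; 0.25"/lenslet -> 4.0" field
```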

    Macromolecular Cryocrystallography


    Person-to-Person Lending: The Pursuit of (More) Competitive Credit Markets

    Person-to-person lending (P2PL) on the Internet is a relatively new credit market. The success of these markets hinges on their ability to provide both borrowers and lenders the chance to improve on the opportunities available in traditional intermediated credit markets. In essence, P2PL must create a more competitive market. Empirical observations provide evidence that frictions exist in these markets, which generally move markets away from competitive outcomes. Currently, auctions are the most popular mechanism for P2PL. This paper develops and analyzes an equilibrium competing auction model of P2PL. Coordination frictions and the presence of non-creditworthy borrowers create an environment where many potentially productive transactions are not made and interest rate dispersion is observed. Additionally, if the market naturally segments into groups of similar borrowers, then increased frictions in a segment may lead some portion of lenders to migrate to a different segment.
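
    A hedged toy illustration of the coordination frictions discussed above (not the paper's competing-auction model): when lenders search for borrowers independently and without coordination, some borrowers attract several offers while others attract none, so potentially productive loans go unmade.

```python
# Toy illustration of coordination frictions (not the paper's model): each
# lender independently makes an offer to one randomly chosen creditworthy
# borrower, so some borrowers attract several offers while others get none.
import random

def unmatched_share(n_borrowers: int, n_lenders: int, trials: int = 2000) -> float:
    """Average fraction of borrowers who receive no offer at all."""
    unmatched = 0
    for _ in range(trials):
        chosen = {random.randrange(n_borrowers) for _ in range(n_lenders)}
        unmatched += n_borrowers - len(chosen)
    return unmatched / (trials * n_borrowers)

# With equal numbers of borrowers and lenders, roughly exp(-1), about 37%,
# of borrowers go unfunded purely because of uncoordinated search.
print(unmatched_share(100, 100))
```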

    Experimental philosophy leading to a small scale digital data base of the conterminous United States for designing experiments with remotely sensed data

    Research using satellite remotely sensed data, even within any single scientific discipline, often lacked a unifying principle or strategy with which to plan or integrate studies conducted over an area so large that exhaustive examination is infeasible, e.g., the U.S.A. However, such a series of studies would seem to be at the heart of what makes satellite remote sensing unique, that is, the ability to select for study from among remotely sensed data sets distributed widely over the U.S., over time, where the resources do not exist to examine all of them. Using this philosophical underpinning and the concept of a unifying principle, an operational procedure for developing a sampling strategy and formal testable hypotheses was constructed. The procedure is applicable across disciplines, when the investigator restates the research question in symbolic form, i.e., quantifies it. The procedure is set within the statistical framework of general linear models. The dependent variable is any arbitrary function of remotely sensed data, and the independent variables are values or levels of factors which represent regional climatic conditions and/or properties of the Earth's surface. These factors are operationally defined as maps from the U.S. National Atlas (U.S.G.S., 1970). Eighty-five maps from the National Atlas, representing climatic and surface attributes, were automated by point counting at an effective resolution of one observation every 17.6 km (11 miles), yielding 22,505 observations per map. The maps were registered to one another in a two-step procedure producing a coarse, then fine scale registration. After registration, the maps were iteratively checked for errors using manual and automated procedures. The error-free maps were annotated with identification and legend information and then stored as card images, one map to a file. A sampling design will be accomplished through a regionalization analysis of the National Atlas data base (presently being conducted). From this analysis a map of homogeneous regions of the U.S.A. will be created and samples (LANDSAT scenes) assigned by region.
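
    A minimal sketch of the general linear model framing described above, with hypothetical factor names and simulated data standing in for the National Atlas maps and the remotely sensed dependent variable.

```python
# Sketch of the general linear model framing described above: y stands in for
# some function of remotely sensed data at each sample location, and the design
# matrix encodes levels of National Atlas factors (factor names are hypothetical).
import numpy as np

rng = np.random.default_rng(0)
n = 500                                  # sample locations (e.g., LANDSAT scenes)
climate = rng.integers(0, 4, size=n)     # hypothetical climatic-region factor (4 levels)
land_cover = rng.integers(0, 3, size=n)  # hypothetical surface-attribute factor (3 levels)

# Dummy-code the categorical factors into a design matrix with an intercept,
# dropping one level of each factor as the reference category.
X = np.column_stack(
    [np.ones(n)]
    + [(climate == k).astype(float) for k in range(1, 4)]
    + [(land_cover == k).astype(float) for k in range(1, 3)]
)
y = rng.normal(size=n)                   # placeholder for f(remotely sensed data)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares estimates
print(beta)
```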

    Development of a bioavailability‐based risk assessment approach for nickel in freshwater sediments

    To assess nickel (Ni) toxicity and behavior in freshwater sediments, a large-scale laboratory and field sediment testing program was conducted. The program used an integrative testing strategy to generate scientifically based threshold values for Ni in sediments and to develop integrated equilibrium partitioning-based bioavailability models for assessing risks of Ni to benthic ecosystems. The sediment testing program was a multi-institutional collaboration that involved extensive laboratory testing, field validation of laboratory findings, characterization of Ni behavior in natural and laboratory conditions, and examination of solid phase Ni speciation in sediments. The laboratory testing initiative was conducted in 3 phases to satisfy the following objectives: 1) evaluate various methods for spiking sediments with Ni to optimize the relevance of sediment Ni exposures; 2) generate reliable ecotoxicity data by conducting standardized chronic ecotoxicity tests using 9 benthic species in sediments with low and high Ni binding capacity; and 3) examine sediment bioavailability relationships by conducting chronic ecotoxicity testing in sediments that showed broad ranges of acid volatile sulfides, organic C, and Fe. A subset of 6 Ni-spiked sediments was deployed in the field to examine benthic colonization and community effects. The sediment testing program yielded a broad, high quality data set that was used to develop a Species Sensitivity Distribution for benthic organisms in various sediment types, a reasonable worst case predicted no-effect concentration for Ni in sediment (PNECsediment), and predictive models for bioavailability and toxicity of Ni in freshwater sediments. A bioavailability-based approach was developed using the ecotoxicity data and bioavailability models generated through the research program. The tiered approach can be used to fulfill the outstanding obligations under the European Union (EU) Existing Substances Risk Assessment, the EU Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH), and other global regulatory initiatives. Integr Environ Assess Manag 2016;12:735-746. © 2015 SETAC
    Key Points:
    A comprehensive, representative sediment toxicity database is available to support risk assessment of Ni in freshwater sediments.
    Sediment Ni ecotoxicity data were gathered from studies that used spiking approaches that resulted in Ni-enriched sediments resembling naturally contaminated sediments, thus increasing their relevance.
    Bioavailability of Ni in sediments, which is controlled by acid volatile sulfides (AVS), varies among different species, with actively bioturbating species showing a shallower slope in the relationship between toxicity and AVS.
    A bioavailability-based, tiered approach is presented, where the first tier involves comparison of ambient total Ni concentrations with an RWC threshold value of 136 mg Ni/kg. Site-specific AVS can be used to calculate a site-specific threshold if ambient Ni is greater than 136 mg Ni/kg.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/134197/1/ieam1720.pdf
    http://deepblue.lib.umich.edu/bitstream/2027.42/134197/2/ieam1720_am.pd
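
    A minimal sketch of the tiered screening logic summarized in the key points, assuming tier 1 is a simple comparison against the 136 mg Ni/kg reasonable-worst-case value; the AVS-based site-specific threshold used in tier 2 is left as a placeholder because the abstract does not give the underlying bioavailability model.

```python
# Sketch of the tiered screening logic summarized in the key points. Tier 1
# compares total Ni with the 136 mg/kg reasonable-worst-case threshold; the
# AVS-adjusted site-specific threshold used in tier 2 is a placeholder, since
# the abstract does not give the underlying bioavailability model.
from typing import Optional

RWC_THRESHOLD_MG_KG = 136.0

def site_specific_threshold(avs_umol_g: float) -> float:
    """Placeholder: would be derived from the program's AVS-based bioavailability models."""
    raise NotImplementedError("requires the site-specific bioavailability model")

def screen_sediment(total_ni_mg_kg: float, avs_umol_g: Optional[float] = None) -> str:
    if total_ni_mg_kg <= RWC_THRESHOLD_MG_KG:
        return "tier 1: below the reasonable-worst-case threshold, low concern"
    if avs_umol_g is not None and total_ni_mg_kg <= site_specific_threshold(avs_umol_g):
        return "tier 2: below the site-specific (AVS-adjusted) threshold"
    return "exceeds screening thresholds: refine the assessment"

print(screen_sediment(90.0))  # passes at tier 1
```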

    Effects of Spinal Fusion for Idiopathic Scoliosis on Lower Body Kinematics During Gait

    Objectives: The purpose of this study was to compare gait among patients with scoliosis undergoing posterior spinal fusion and instrumentation (PSFI) to typically developing subjects and determine if the location of the lowest instrumented vertebra impacted results.
    Summary of Background Data: PSFI is the standard of care for correcting spine deformities, allowing the preservation of body equilibrium while maintaining as many mobile spinal segments as possible. The effect of surgery on joint motion distal to the spine must also be considered. Very few studies have addressed the effect of PSFI on activities such as walking, and even fewer address how surgical choice of the lowest instrumented vertebra (LIV) influences possible motion reduction.
    Methods: Individuals with scoliosis undergoing PSFI (n = 38) completed gait analysis preoperatively and at postoperative years 1 and 2, along with a control group (n = 24). Comparisons were made with the control group at each time point and between patients fused at L2 and above (L2+) versus L3 and below (L3–).
    Results: The kinematic results of the AIS group showed some differences when compared to the control group, most notably decreased range of motion (ROM) in pelvic tilt and trunk lateral bending. When comparing the LIV groups, only minor differences were observed: decreased coronal trunk and pelvis ROM at the one-year visit and decreased hip rotation ROM at the two-year visit in the L3– group.
    Conclusions: Patients with AIS showed decreased ROM preoperatively, with further decreases postoperatively. These changes remained relatively consistent through the two-year visit, indicating that most kinematic changes occurred in the first year following surgery. Limited functional differences between the two LIV groups may be due to the lack of full ROM used during normal gait, and future work could address tasks that use greater ROM.

    Honest Majority Multi-Prover Interactive Arguments

    Interactive arguments, and their (succinct) non-interactive and zero-knowledge counterparts, have seen growing deployment in real world applications in recent years. Unfortunately, for large and complex statements, concrete proof generation costs can still be quite expensive. While recent work has sought to solve this problem by outsourcing proof computation to a group of workers in a privacy preserving manner, current solutions still require each worker to do work on roughly the same order as a single-prover solution. We introduce the Honest Majority Multi-Prover (HMMP) model for interactive arguments. In these arguments, we distribute prover computation among M collaborating, but distrusting, provers. All provers receive the same inputs and have no private inputs; we allow any t < M/2 provers to be statically corrupted before generation of public parameters, and all communication is done via an authenticated broadcast channel. In contrast with the recent works of Ozdemir and Boneh (USENIX '22) and Dayama et al. (PETS '22), we target prover efficiency over privacy. We show that: (1) any interactive argument where the prover computation is suitably divisible into M sub-computations can be transformed into an interactive argument in the HMMP model; and (2) arguments that are obtained via compiling polynomial interactive oracle proofs with polynomial commitment schemes admit HMMP model constructions that experience a (roughly) 1/M speedup over a single-prover solution. The transformation of (1) preserves computational (knowledge) soundness and zero-knowledge, and can be made non-interactive via the Fiat-Shamir transformation. The constructions of (2) showcase that there are efficiency gains in proof distribution when privacy is not a concern.
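
    A minimal sketch of the kind of divisible prover work the HMMP transformation relies on (an illustration only, not the paper's construction): when the prover's computation is a sum of independent pieces, such as evaluating a long polynomial at a challenge point, M provers can each handle one chunk and broadcast a partial result, giving each roughly a 1/M share of the work.

```python
# Illustrative sketch (not the paper's construction): when the prover's work
# is a sum of independent pieces -- here, evaluating a long polynomial at a
# challenge point -- M provers can each evaluate one chunk of the coefficients
# and broadcast a partial result, so each does roughly a 1/M share of the work.
P = 2**61 - 1  # prime modulus for a toy field

def partial_eval(coeffs, start, x):
    """Evaluate sum_i coeffs[i] * x**(start + i) mod P for one chunk."""
    acc, power = 0, pow(x, start, P)
    for c in coeffs:
        acc = (acc + c * power) % P
        power = (power * x) % P
    return acc

def distributed_eval(coeffs, x, M):
    """Split the coefficients into M chunks and combine the broadcast partial sums."""
    chunk = (len(coeffs) + M - 1) // M
    parts = [partial_eval(coeffs[i:i + chunk], i, x)
             for i in range(0, len(coeffs), chunk)]
    return sum(parts) % P

coeffs = list(range(1, 1001))
assert distributed_eval(coeffs, x=12345, M=4) == partial_eval(coeffs, 0, 12345)
```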